# Multilingual Optimization
## Suzume Llama 3 8B Multilingual ORPO Borda Half
A multilingual large language model fine-tuned from Llama-3-8B with the ORPO method, trained on the 50% of the ranking data with the most consistent rankings; it performs well on tasks across many languages.
Large Language Model · Transformers
lightblue · 4,625 · 16
## LitLat BERT
LitLat BERT is a trilingual model based on the xlm-roberta-base architecture, targeting performance in Lithuanian, Latvian, and English.
Large Language Model · Transformers · Supports Multiple Languages
EMBEDDIA · 937 · 5